Fast Implementation of Density-Weighted Average Derivative Estimation
Author
Abstract
Given random variables X ∈ IR^d and Y such that E[Y |X = x] = m(x), the average derivative δ0 is defined as δ0 = E[∇m(X)], i.e., as the expected value of the gradient of the regression function. Average derivative estimation has several applications in econometric theory (Stoker, 1992), so a fast implementation of this estimator is crucial for practical purposes. We present such an implementation for a variation known as density-weighted average derivative estimation. The algorithm is based on the idea of binning, also known as Weighted Averaging of Rounded Points (WARPing). The basic idea of this method is to discretize the original data into a d-variate histogram and to replace, in the nonparametric smoothing steps, the actual observations by the appropriate bin centers. The nonparametric smoothing steps thus become a (multi-dimensional) convolution between the (discretized) data and the (discretized) smoothing kernel. A Monte Carlo study demonstrates that this binned implementation achieves a substantial reduction in computing time. It also becomes clear, however, that in higher dimensions the choice of how to bin is crucial.
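To illustrate the binning idea described in the abstract, the sketch below implements a one-dimensional WARPed (binned) kernel smoother in Python. This is a minimal sketch, not the paper's implementation: the function name, the Gaussian kernel, and the simple-rounding binning rule are all assumptions chosen for illustration.

```python
import numpy as np

def warped_density(x, grid_min, grid_max, n_bins, bandwidth):
    """1-D binned (WARPed) kernel density estimate (illustrative sketch).

    The data are discretized into a histogram, and the kernel
    smoothing step is replaced by a discrete convolution of the
    bin counts with the kernel evaluated at bin-center distances.
    """
    delta = (grid_max - grid_min) / n_bins                 # bin width
    # Step 1: round each observation to a bin (simple binning).
    idx = np.clip(((x - grid_min) / delta).astype(int), 0, n_bins - 1)
    counts = np.bincount(idx, minlength=n_bins)
    # Step 2: discretize a Gaussian kernel on the same grid
    # (support truncated at +/- 4 standard deviations).
    half_width = int(np.ceil(4 * bandwidth / delta))
    u = np.arange(-half_width, half_width + 1) * delta / bandwidth
    kernel = np.exp(-0.5 * u ** 2) / np.sqrt(2 * np.pi)
    # Step 3: the smoothing step is now a single convolution of the
    # bin counts with the discretized kernel.
    smooth = np.convolve(counts, kernel, mode="same")
    centers = grid_min + (np.arange(n_bins) + 0.5) * delta
    return centers, smooth / (len(x) * bandwidth)
```

Because the n observations are replaced by at most n_bins distinct bin centers, the smoothing step reduces to a single convolution over the grid rather than one kernel evaluation per pair of points, which is the source of the speed-up the abstract reports; in d dimensions the same idea applies with a d-variate histogram and a d-dimensional convolution.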
Similar Resources
Linear Wavelet-Based Estimation for Derivative of a Density under Random Censorship
In this paper we consider estimation of the derivative of a density based on wavelet methods using randomly right-censored data. We extend the results regarding the asymptotic convergence rates due to Prakasa Rao (1996) and Chaubey et al. (2008) under the random censorship model. Our treatment is facilitated by results of Stute (1995) and Li (2003) that enable us to demonstrate that the same con...
On the estimation of density-weighted average derivative by wavelet methods under various dependence structures
The problem of estimating the density-weighted average derivative of a regression function is considered. We present a new consistent estimator based on a plug-in approach and wavelet projections. Its performance is explored under various dependence structures on the observations: the independent case, the ρ-mixing case and the α-mixing case. More precisely, denoting by n the number of observati...
Nordsieck representation of high order predictor-corrector Obreshkov methods and their implementation
Predictor-corrector (PC) methods for the numerical solution of stiff ODEs can be extended to include the second derivative of the solution. In this paper, we consider second derivative PC methods with the three-step second derivative Adams-Bashforth as predictor and the two-step second derivative Adams-Moulton as corrector, both of which have order six. Implementation of the proposed PC method ...
Uniform Convergence of Weighted Sums of Non- and Semi-parametric Residuals for Estimation and Testing∗
A new uniform expansion is introduced for sums of weighted kernel-based regression residuals from nonparametric or semiparametric models. This result is useful for deriving asymptotic properties of semiparametric estimators and test statistics with data-dependent bandwidth, random trimming, and estimated weights. An extension allows for generated regressors, without requiring the calculation of...
What Do Kernel Density Estimators Optimize?
Some linkages between kernel and penalty methods of density estimation are explored. It is recalled that classical Gaussian kernel density estimation can be viewed as the solution of the heat equation with initial condition given by data. We then observe that there is a direct relationship between the kernel method and a particular penalty method of density estimation. For this penalty method, ...
Journal title:
Volume, Issue
Pages -
Publication date: 2007